
    Trimming Stability Selection increases variable selection robustness

    Contamination can severely distort an estimator unless the estimation procedure is suitably robust. This is a well-known issue and has been addressed in Robust Statistics; however, the relation between contamination and distorted variable selection has rarely been considered in the literature. Many methods for sparse model selection have been proposed, including Stability Selection, a meta-algorithm that wraps a variable selection algorithm in order to immunize it against particular data configurations. We introduce the variable selection breakdown point, which quantifies the number of cases or cells, respectively, that have to be contaminated so that no relevant variable is detected. We show that particular outlier configurations can completely mislead model selection and argue why even cell-wise robust methods cannot fix this problem. We combine the variable selection breakdown point with resampling, resulting in the Stability Selection breakdown point, which quantifies the robustness of Stability Selection. We propose a trimmed Stability Selection which aggregates only the models with the lowest in-sample losses so that, heuristically, models computed on heavily contaminated resamples are trimmed away. An extensive simulation study with non-robust regression and classification algorithms as well as with Sparse Least Trimmed Squares reveals both the potential of our approach to boost model selection robustness and the fragility of variable selection with non-robust algorithms, even for an extremely small cell-wise contamination rate.
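    A minimal sketch of the trimming idea described above (not the authors' implementation): resample the data, run a base sparse selector on each resample (here scikit-learn's Lasso stands in for a generic variable selection algorithm), keep only the fraction of resamples with the lowest in-sample loss, and aggregate selection frequencies over those. The parameter names and the choice of loss are illustrative assumptions.

    ```python
    # Hypothetical sketch of a trimmed Stability Selection with a Lasso base
    # learner and squared-error in-sample loss; names are illustrative only.
    import numpy as np
    from sklearn.linear_model import Lasso

    def trimmed_stability_selection(X, y, n_resamples=100, subsample_frac=0.5,
                                    trim_frac=0.5, alpha=0.1, rng=None):
        rng = np.random.default_rng(rng)
        n, p = X.shape
        m = int(subsample_frac * n)
        losses, supports = [], []
        for _ in range(n_resamples):
            idx = rng.choice(n, size=m, replace=False)
            model = Lasso(alpha=alpha).fit(X[idx], y[idx])
            losses.append(np.mean((y[idx] - model.predict(X[idx])) ** 2))
            supports.append(model.coef_ != 0)
        # Trimming step: keep only the resamples with the lowest in-sample
        # loss, so resamples hit hard by contamination are heuristically
        # discarded before aggregation.
        keep = np.argsort(losses)[: int(trim_frac * n_resamples)]
        freq = np.mean(np.array(supports)[keep], axis=0)
        return freq  # selection frequency per variable; threshold to select
    ```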

    MUVOT - Establishing an International Vocational Training Program on the Topic of Measurement Uncertainty

    Measurement results represent important information that is necessary for evaluating and improving the quality of manufactured products and for controlling manufacturing processes. Furthermore, they form the basis for numerous decisions in the fields of quality management, process and production automation, and product development and design. Knowledge about the acquisition, evaluation and interpretation of measurement data, as well as an understanding of the relevant influences on measurement results, is essential for employees working in manufacturing metrology. Measurement results are always afflicted with deviations arising from a variety of causes. It follows that, in order to assign a value to the reliability and quality of a measurement result, its uncertainty must be determined and considered. However, employees in quality management or metrology are often not familiar with methods for determining and interpreting measurement uncertainty, because appropriate training opportunities are missing from current vocational education. This need led to the European project MUVoT, which develops a course for advanced vocational training in determining measurement uncertainty. The training course is based on a blended learning concept, combining independent learning via a web-based platform with face-to-face workshops. This allows knowledge and skills to be built up individually through self-controlled study of abstract content, while the exercises enable the practical application of methods that many employees consider quite complex, thus ensuring correct understanding. The blended learning concept facilitates the integration of the training into a workplace setting, promoting the idea of lifelong learning in new fields of application. The curriculum and training concept for this newly developed program have been designed so that the course can be applied internationally. To facilitate this, a harmonized scheme for course structure and contents has been defined, albeit with inherent flexibility, allowing adaptation to specific constraints.

    Towards Safe and Sustainable Autonomous Vehicles Using Environmentally-Friendly Criticality Metrics

    This paper presents a mathematical analysis of several criticality metrics used for evaluating the safety of autonomous vehicles (AVs) and proposes novel environmentally-friendly metrics, with the aim of facilitating their selection by future researchers who want to evaluate both the safety and the environmental impact of AVs. First, we investigate whether the criticality metrics used to quantify the severity of critical situations in autonomous driving are well defined and work as intended. In some cases, the well-definedness or the intended behavior of the metrics is apparent, but in other cases we present mathematical demonstrations of these properties as well as alternative novel formulas. We also present details regarding optimality. Second, we propose several novel environmentally-friendly metrics, including a criticality metric that combines safety and environmental impact in a car-following scenario. Third, we discuss the possibility of applying these criticality metrics in artificial intelligence (AI) training, such as reinforcement learning (RL), where they can be used as penalty terms, i.e., negative reward components. Finally, we apply some of the metrics in a simple car-following scenario and show in our simulation that AVs powered by petrol emitted the most carbon (54.92 g of CO2), followed closely by diesel-powered AVs (54.67 g of CO2) and then by grid-electricity-powered AVs (31.16 g of CO2). Meanwhile, AVs powered by electricity from a green source, such as solar energy, had 0 g of CO2 emissions, encouraging future researchers and industry to develop more sustainable methods and metrics for powering AVs and for evaluating their safety and environmental impact using green energy.
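    As a rough illustration of how such metrics could enter RL training as negative reward components (the specific metric, weights, and threshold below are assumptions, not the paper's formulas): a standard time-to-collision (TTC) term for a car-following scenario combined with a per-step CO2 term.

    ```python
    # Illustrative sketch only: TTC-based criticality plus a CO2 term as a
    # negative reward; weights and the 3 s threshold are placeholders.
    def ttc(gap_m, v_follow, v_lead):
        """Time-to-collision; infinite if the follower is not closing the gap."""
        closing = v_follow - v_lead
        return gap_m / closing if closing > 0 else float("inf")

    def step_reward(gap_m, v_follow, v_lead, co2_g_per_step,
                    w_safety=1.0, w_env=0.01, ttc_threshold=3.0):
        t = ttc(gap_m, v_follow, v_lead)
        # Penalize only when TTC drops below the assumed critical threshold.
        safety_penalty = max(0.0, ttc_threshold - t) if t != float("inf") else 0.0
        return -(w_safety * safety_penalty + w_env * co2_g_per_step)

    # Example: 20 m gap, follower at 15 m/s, leader at 10 m/s, 0.5 g CO2 this step
    print(step_reward(20.0, 15.0, 10.0, 0.5))
    ```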

    Sequence optimization to reduce velocity offsets in cardiovascular magnetic resonance volume flow quantification--a multi-vendor study.

    PURPOSE: Eddy current induced velocity offsets are a concern for accuracy in cardiovascular magnetic resonance (CMR) volume flow quantification. However, currently known theoretical aspects of eddy current behavior have not led to effective guidelines for the optimization of flow quantification sequences. This study aimed to identify correlations between protocol parameters and the resulting velocity error in clinical CMR flow measurements in a multi-vendor study. METHODS: Nine 1.5T scanners of three different types/vendors were studied. Measurements were performed on a large stationary phantom. Starting from a clinical breath-hold flow protocol, several protocol parameters were varied. Acquisitions were made in three clinically relevant orientations. Additionally, a time delay between the bipolar gradient and read-out, asymmetric versus symmetric velocity encoding, and gradient amplitude and slew rate were studied in adapted sequences as exploratory measurements beyond the protocol. Image analysis determined the worst-case offset for a typical great-vessel flow measurement. RESULTS: The results showed a large variation in offset behavior among scanners (standard deviation among samples of 0.3, 0.4, and 0.9 cm/s for the three scanner types), even for small changes in the protocol. Considering the absolute values, none of the tested protocol settings consistently reduced the velocity offsets below the critical level of 0.6 cm/s, either for all three orientations or for all three scanner types. Using multilevel linear model analysis, oblique aortic and pulmonary slices showed systematically higher offsets than transverse aortic slices (oblique aortic 0.6 cm/s and pulmonary 1.8 cm/s higher than transverse aortic). The exploratory measurements beyond the protocol yielded some new leads for further sequence development towards reduction of velocity offsets; however, those protocols were not always compatible with the time constraints of breath-hold imaging or with flow-related artefacts. CONCLUSIONS: This study showed that with current systems there is no generic protocol that results in acceptable flow offset values. Protocol optimization would have to be performed on a per-scanner and per-protocol basis. Proper optimization might make accurate (transverse) aortic flow quantification possible for most scanners. Pulmonary flow quantification would still need further (offline) correction.
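    To illustrate why residual offsets of this magnitude matter, here is a back-of-the-envelope example (the vessel area and heart rate are assumed values, not study data): a uniform velocity offset over the vessel region of interest adds a constant bias to the integrated volume flow.

    ```python
    # Rough illustration with assumed vessel area and heart rate:
    # a uniform residual velocity offset over the ROI biases volume flow.
    offset_cm_s = 0.6          # critical offset level reported in the study
    vessel_area_cm2 = 5.0      # assumed aortic cross-section
    heart_rate_bpm = 70        # assumed

    flow_bias_ml_s = offset_cm_s * vessel_area_cm2          # 3.0 mL/s
    stroke_bias_ml = flow_bias_ml_s * 60 / heart_rate_bpm   # ~2.6 mL per beat
    print(f"bias: {flow_bias_ml_s:.1f} mL/s, {stroke_bias_ml:.1f} mL/beat")
    ```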

    A Comparison of Major Arable Production Systems: An Agronomic, Environmental and Ecological Evaluation

    One of the primary challenges of our time is to develop sustainable farming systems that can feed the world with minimal environmental impact. Some studies argue that organic farming systems are best because they have minimal impact on the environment and benefit biodiversity. Others argue that no-tillage systems are better because they save energy and preserve soil structure and quality. A third group argues that conventional farming systems are best because their yield per hectare is highest. However, systematic comparisons of major arable production systems are rare, and it is often difficult to compare the advantages and disadvantages of farming systems systematically because of differences in soil/site characteristics and management. Here we present data from the Swiss Farming Systems and Tillage Experiment (FAST), a long-term experiment in which the main European arable production systems (organic and conventional farming, reduced tillage and no tillage, each with different cover crop treatments) are compared using a factorial replicated design. A multidisciplinary team of researchers from various organizations analysed this experiment. We show the advantages and disadvantages of the various production systems and present data on plant yield, life cycle analysis, global warming potential, soil quality, plant root microbiomes, and above- and below-ground biodiversity. Our results demonstrate that: i) plant yield was highest in the conventional systems; ii) soil biodiversity and above-ground diversity tended to be higher in organic production systems; iii) soil erosion was lowest in the absence of tillage and in organic production systems; iv) the positive effects of cover crops were highest in organic production systems and increased with reduced land use intensity; v) the global warming potential of organic farming systems was lower than that of conventional systems; and vi) root and plant microbiomes varied between the farming systems, with indicator species specific to individual farming practices. In a next step, we compared the results of this experiment with observations from a large farmers' network (60 fields) in Switzerland (see abstract by Büchi et al.) in which organic, conventional and conservation agriculture were compared. The results of our trial (e.g. yield and environmental performance of the different farming systems) were largely in agreement with those observed in the farmers' network. Overall, our results indicate that no single farming system is best, and the choice of the "best" production system depends on economic, ecological and environmental priorities.

    Technology roadmap for cold-atoms based quantum inertial sensor in space

    Recent developments in quantum technology have resulted in a new generation of sensors for measuring inertial quantities such as acceleration and rotation. These sensors can exhibit unprecedented sensitivity and accuracy when operated in space, where the free-fall interrogation time can be extended at will and where environmental noise is minimal. European laboratories have played a leading role in this field by developing concepts and tools to operate these quantum sensors in relevant environments, such as parabolic flights, free-fall towers, or sounding rockets. With the recent achievement of Bose-Einstein condensation on the International Space Station, the challenge is now to reach a technology readiness level sufficiently high at both component and system levels to provide "off the shelf" payloads for future generations of space missions in geodesy or fundamental physics. In this roadmap, we provide an extensive review of the status of all common parts, needs, and subsystems for the application of atom-based interferometers in space, in order to push for the development of generic technology components.
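    For context on why extended free-fall interrogation time is so valuable, the textbook relation for a Mach-Zehnder atom interferometer (a standard result, not specific to this roadmap) shows quadratic scaling with the pulse separation time:

    ```latex
    % Standard Mach-Zehnder atom-interferometer phase shift for a constant
    % acceleration $a$, effective wave vector $k_\mathrm{eff}$, and pulse
    % separation time $T$ (textbook relation, not taken from the roadmap):
    \Delta\phi = k_\mathrm{eff}\, a\, T^{2}
    % Extending $T$ from ~0.1 s on the ground to ~1 s in microgravity thus
    % gains roughly two orders of magnitude in acceleration sensitivity at
    % fixed phase resolution.
    ```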